Smooth Diagonal Weighted Newton Support Vector Machine
Authors
Abstract
Similar Resources
Weighted Twin Support Vector Machine with Universum
Universum is a recently proposed concept: Universum samples are samples that do not belong to any of the classes under consideration. Support Vector Machine with Universum (U-SVM) is a new algorithm that can exploit Universum samples to improve the classification performance of SVM. In fact, samples at different positions have different effects on the bound function. Then, we propose a weighted ...
Finite Newton method for Lagrangian support vector machine classification
An implicit Lagrangian [18] formulation of a support vector machine classifier that led to a highly effective iterative scheme [17] is solved here by a finite Newton method. The proposed method, which is extremely fast and terminates in 6 or 7 iterations, can handle classification problems in very high dimensional spaces, e.g. over 28,000, in a few seconds on a 400 MHz Pentium II machine. The m...
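For illustration only, the following Python sketch shows a generalized Newton iteration for a squared-hinge linear SVM, in the spirit of the finite Newton methods above; the concrete objective, variable names, and stopping rule are assumptions, not the exact algorithm of the cited paper.

import numpy as np

def newton_svm(A, y, C=1.0, max_iter=20, tol=1e-6):
    # Minimize 0.5*||w||^2 + (C/2)*sum(max(0, 1 - y_i * (A_i . w))^2)
    # with a generalized Newton step (illustrative sketch only).
    m, n = A.shape
    w = np.zeros(n)
    for _ in range(max_iter):
        margins = 1.0 - y * (A @ w)            # hinge residuals
        active = margins > 0                   # margin-violating points
        grad = w - C * A[active].T @ (y[active] * margins[active])
        if np.linalg.norm(grad) < tol:         # stationary point reached
            break
        # Generalized Hessian: identity plus contribution of active points.
        H = np.eye(n) + C * A[active].T @ A[active]
        w -= np.linalg.solve(H, grad)          # Newton step
    return w

Each iteration solves an n x n linear system; for the very high dimensional problems mentioned in the abstract, the cited methods rely on further linear-algebra refinements that this sketch omits.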
A Feature Selection Newton Method for Support Vector Machine Classification
A fast Newton method that suppresses input space features is proposed for a linear programming formulation of support vector machine classifiers. The proposed stand-alone method can handle classification problems in very high dimensional spaces, such as 28,032 dimensions, and generates a classifier that depends on very few input features, such as 7 out of the original 28,032. The method can a...
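As a rough companion sketch (not the Newton solver of the cited paper), a 1-norm linear SVM can be posed as a linear program; the 1-norm objective is what drives many feature weights to exactly zero. The variable layout and the use of scipy.optimize.linprog are choices made here for illustration.

import numpy as np
from scipy.optimize import linprog

def l1_svm(X, y, C=1.0):
    # min ||w||_1 + C*sum(xi)  s.t.  y_i*(w.x_i + b) >= 1 - xi, xi >= 0,
    # with w split as w = p - q, p >= 0, q >= 0, so the problem is an LP.
    m, n = X.shape
    Yx = X * y[:, None]                               # rows y_i * x_i
    c = np.concatenate([np.ones(2 * n), [0.0], C * np.ones(m)])
    A_ub = np.hstack([-Yx, Yx, -y[:, None], -np.eye(m)])
    b_ub = -np.ones(m)
    bounds = [(0, None)] * (2 * n) + [(None, None)] + [(0, None)] * m
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    w = res.x[:n] - res.x[n:2 * n]                    # recover the weight vector
    b = res.x[2 * n]
    return w, b

Features whose entries of w come back exactly zero can be discarded, which is the feature-suppression effect the abstract refers to.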
A Weighted Support Vector Machine for Data Classification
This paper presents a weighted support vector machine (WSVM) to alleviate the outlier sensitivity problem of the standard support vector machine (SVM) for two-class data classification. The basic idea is to assign different weights to different data points so that the WSVM training algorithm learns the decision surface according to the relative importance of each data point in the training data set. Th...
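A minimal sketch of the weighting idea, assuming scikit-learn's SVC and a simple distance-to-centroid weighting rule (the rule below is an assumption for illustration, not the scheme proposed in the cited paper):

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)

# Down-weight points far from their class centroid so that suspected
# outliers have less influence on the decision surface.
centroids = {c: X[y == c].mean(axis=0) for c in (-1, 1)}
dist = np.array([np.linalg.norm(x - centroids[c]) for x, c in zip(X, y)])
weights = 1.0 / (1.0 + dist)

clf = SVC(kernel="linear", C=1.0)
# sample_weight scales each point's penalty, i.e. effectively C_i = C * weight_i.
clf.fit(X, y, sample_weight=weights)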
A Weighted Least Squares Twin Support Vector Machine
Least squares twin support vector machine (LS-TSVM) solves a pair of smaller-sized quadratic programming problems (QPPs) instead of the single large one in the conventional least squares support vector machine (LS-SVM), which makes the learning speed of LS-TSVM faster than that of LS-SVM. However, the same penalties are given to the negative samples when constructing the hyper-plane for...
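For reference, a compact sketch of the unweighted LS-TSVM solution as two small regularized linear systems (matrix names follow common TSVM notation; the weighted variant described in the abstract would additionally scale the per-sample error terms, which this sketch omits):

import numpy as np

def ls_tsvm(A, B, c1=1.0, c2=1.0, eps=1e-8):
    # A: samples of class +1, B: samples of class -1.
    # Returns (w1, b1), (w2, b2) for the two non-parallel hyperplanes.
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    E = np.hstack([A, e1])            # augmented data of class +1
    F = np.hstack([B, e2])            # augmented data of class -1
    I = eps * np.eye(E.shape[1])      # small ridge for numerical stability
    # Hyperplane close to class +1 and (in the least squares sense) at
    # distance 1 from class -1, plus the symmetric problem for class -1.
    z1 = -np.linalg.solve(E.T @ E + c1 * F.T @ F + I, c1 * F.T @ e2)
    z2 = np.linalg.solve(F.T @ F + c2 * E.T @ E + I, c2 * E.T @ e1)
    return (z1[:-1], z1[-1]), (z2[:-1], z2[-1])

A new point is then assigned to the class whose hyperplane it lies closer to.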
Journal
Journal title: Mathematical Problems in Engineering
Year: 2013
ISSN: 1024-123X, 1563-5147
DOI: 10.1155/2013/349120